A geometrical approach to Iterative Isotone Regression
Abstract
In the present paper, we propose and analyze a novel method for estimating a univariate regression function of bounded variation. The underpinning idea is to combine two classical tools in nonparametric statistics, namely isotonic regression and the estimation of additive models. A geometrical interpretation enables us to link this iterative method with Von Neumann’s algorithm. Moreover, making...
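The combination described above can be sketched concretely. The following is an illustrative Python sketch, not the authors' implementation: isotonic regression via the pool-adjacent-violators algorithm (PAVA), backfitted between a nondecreasing and a nonincreasing component, in the alternating-projection spirit of Von Neumann's algorithm. The iteration count and the two-component decomposition are illustrative choices.

```python
def pava(y):
    """Pool Adjacent Violators: least-squares nondecreasing fit of y."""
    blocks = []  # each block is [mean, size]
    for v in y:
        blocks.append([v, 1])
        # merge adjacent blocks while they violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, s2 = blocks.pop()
            m1, s1 = blocks.pop()
            blocks.append([(m1 * s1 + m2 * s2) / (s1 + s2), s1 + s2])
    return [m for m, s in blocks for _ in range(s)]


def iterative_isotone(y, n_iter=10):
    """Backfit a nondecreasing and a nonincreasing component to y.

    Alternates two isotone fits on partial residuals, so the final
    estimate (a function of bounded variation) is their sum.
    """
    up = [0.0] * len(y)
    down = [0.0] * len(y)
    for _ in range(n_iter):
        # fit the increasing part to the residual of the decreasing part
        r = [yi - di for yi, di in zip(y, down)]
        up = pava(r)
        # fit the decreasing part: a nonincreasing fit of r is -PAVA(-r)
        r = [yi - ui for yi, ui in zip(y, up)]
        down = [-v for v in pava([-ri for ri in r])]
    return [u + d for u, d in zip(up, down)]
```

For a unimodal input such as `[0, 2, 1]`, the increasing component captures the rise and the decreasing component the fall, and their sum reproduces the data closely after a few sweeps.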
Similar works

Additive isotone regression
This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting. Convergence of the algorith...
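The backfitting scheme this abstract mentions can be sketched in a few lines. This is a minimal illustration under a least-squares loss, with each univariate isotone fit done by pool-adjacent-violators; it is not the paper's actual algorithm, and the sweep count is an arbitrary choice.

```python
def pava(y):
    """Pool Adjacent Violators: least-squares nondecreasing fit of y."""
    blocks = []  # each block is [mean, size]
    for v in y:
        blocks.append([v, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, s2 = blocks.pop()
            m1, s1 = blocks.pop()
            blocks.append([(m1 * s1 + m2 * s2) / (s1 + s2), s1 + s2])
    return [m for m, s in blocks for _ in range(s)]


def backfit_additive_isotone(X, y, n_iter=20):
    """Fit y ~ f_1(x_1) + ... + f_d(x_d), each f_j nondecreasing.

    X is a list of d covariate columns, each of length n.
    Classical backfitting: cycle over components, refitting each one
    to the partial residual that removes all the others.
    """
    n, d = len(y), len(X)
    f = [[0.0] * n for _ in range(d)]
    for _ in range(n_iter):
        for j in range(d):
            # partial residual with the other components removed
            r = [y[i] - sum(f[k][i] for k in range(d) if k != j)
                 for i in range(n)]
            # isotone fit in the order of the j-th covariate
            order = sorted(range(n), key=lambda i: X[j][i])
            fit = pava([r[i] for i in order])
            for pos, i in enumerate(order):
                f[j][i] = fit[pos]
    return f
```

With a single covariate this reduces to one PAVA fit; with several, each sweep projects the current residual onto one monotone component at a time.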
A Geometrical Approach to Aggregation
Considering the family F of contour curves F = {h(x, y) = k} of an (idempotent) aggregation operator h in two variables as a one-parameter family of curves, the differential equation y′ = f(x, y) having F as its general solution is associated with h. Properties of h can then be translated into properties of its differential equation. Conversely, for a differential equation fulfilling some easy prope...
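To make the correspondence concrete, here is a simple worked example (ours, not from the cited paper): the contour families of the arithmetic and geometric means, and the differential equations they satisfy after implicit differentiation in k.

```latex
h(x,y) = \tfrac{x+y}{2}
  \;\Longrightarrow\; x + y = 2k
  \;\Longrightarrow\; 1 + y' = 0
  \;\Longrightarrow\; y' = -1,
\qquad
h(x,y) = \sqrt{xy}
  \;\Longrightarrow\; xy = k^{2}
  \;\Longrightarrow\; y + x\,y' = 0
  \;\Longrightarrow\; y' = -\frac{y}{x}.
```

In both cases the parameter k is eliminated, leaving a first-order equation y′ = f(x, y) whose general solution is exactly the contour family of h.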
LASSO ISOtone for High Dimensional Additive Isotonic Regression
Additive isotonic regression attempts to determine the relationship between a multi-dimensional observation variable and a response, under the constraint that the estimate is the additive sum of univariate component effects that are monotonically increasing. In this article, we present a new method for such regression called LASSO Isotone (LISO). LISO adapts ideas from sparse linear modelling t...
A New Function for Robust Linear Regression: An Iterative Approach
In this paper, we consider solving the robust linear regression problem. We show that IRLS and the Newton method can each be combined with the preconditioned conjugate gradient least squares method to solve large, sparse, rectangular systems of linear algebraic equations efficiently. We define a new function that leads to a cheap preconditioner. Further, for this function, we show that the upper bound on...
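IRLS itself can be illustrated compactly. Below is a minimal sketch of a robust straight-line fit with Huber weights; the preconditioned CGLS machinery discussed in the abstract is omitted, and the threshold `delta`, the iteration count, and the plain normal-equations solve are all illustrative choices, not the paper's method.

```python
def irls_line(x, y, delta=1.0, n_iter=20):
    """Robust fit of y ~ a + b*x by iteratively reweighted least squares.

    Huber weights: residuals within delta get weight 1; larger
    residuals get weight delta/|r|, bounding the outliers' influence.
    """
    a, b = 0.0, 0.0
    for _ in range(n_iter):
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        w = [1.0 if abs(ri) <= delta else delta / abs(ri) for ri in r]
        # weighted normal equations: a 2x2 linear system
        Sw = sum(w)
        Sx = sum(wi * xi for wi, xi in zip(w, x))
        Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        Sy = sum(wi * yi for wi, yi in zip(w, y))
        Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        det = Sw * Sxx - Sx * Sx
        a = (Sy * Sxx - Sx * Sxy) / det
        b = (Sw * Sxy - Sx * Sy) / det
    return a, b
```

On exactly linear data the weighted solve recovers the line immediately; with a gross outlier, the reweighting keeps the slope far closer to the bulk of the data than ordinary least squares would.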
Journal
Journal title: Applied Mathematics and Computation
Year: 2014
ISSN: 0096-3003
DOI: 10.1016/j.amc.2013.11.048